Asymptotic Properties of Two Time-Scale Stochastic Approximation Algorithms with Constant Step Sizes
Authors
Abstract
Asymptotic properties of two time-scale stochastic approximation algorithms with constant step sizes are analyzed in this paper. The analysis is carried out for algorithms with additive noise, as well as for algorithms with non-additive noise. The algorithms with additive noise are considered for the case where the noise is state-dependent and admits a decomposition as the sum of a martingale difference sequence and a telescoping sequence. The algorithms with non-additive noise are analyzed for the case where the noise satisfies uniform or strong mixing conditions, as well as for the case where the noise is a Markov chain controlled by the algorithm states.
Acknowledgement
This paper is based upon work supported by the National Science Foundation under Award No. DMI 00 85165. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the National Science Foundation.
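To make the abstract's setting concrete, the sketch below illustrates the general form of a two time-scale stochastic approximation recursion with constant step sizes and additive noise. The drift functions h and g, the step sizes alpha and beta, and the i.i.d. Gaussian noise model are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def two_time_scale_sa(h, g, x0, y0, alpha=1e-2, beta=1e-3,
                      n_iter=10_000, noise_std=0.1, rng=None):
    """Generic two time-scale stochastic approximation with constant step sizes.

    Fast iterate:  x_{k+1} = x_k + alpha * (h(x_k, y_k) + noise_k)
    Slow iterate:  y_{k+1} = y_k + beta  * (g(x_k, y_k) + noise_k),  beta << alpha
    """
    rng = np.random.default_rng() if rng is None else rng
    x, y = np.asarray(x0, float), np.asarray(y0, float)
    for _ in range(n_iter):
        # Additive noise modelled here as i.i.d. Gaussian,
        # i.e. a martingale difference sequence (an assumption for this sketch).
        x = x + alpha * (h(x, y) + noise_std * rng.standard_normal(x.shape))
        y = y + beta * (g(x, y) + noise_std * rng.standard_normal(y.shape))
    return x, y

# Example: the fast iterate tracks x*(y) = y/2 while the slow iterate drives y toward 0.
x_hat, y_hat = two_time_scale_sa(
    h=lambda x, y: y - 2.0 * x,   # fast dynamics
    g=lambda x, y: -x,            # slow dynamics
    x0=np.array([1.0]), y0=np.array([1.0]),
)
```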
Related papers
Iterate-averaging sign algorithms for adaptive filtering with applications to blind multiuser detection
Motivated by recent developments in iterate averaging of recursive stochastic approximation algorithms and the asymptotic analysis of sign-error algorithms for adaptive filtering, this work develops two-stage sign algorithms for adaptive filtering. The proposed algorithms are based on constructing a sequence of estimates using large step sizes, followed by iterate averaging. Our main effort ...
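As a rough illustration of the "large constant step size followed by iterate averaging" idea described above, here is a minimal sign-error LMS sketch with a second averaging stage; the step size mu, the burn-in length, and the synthetic data are assumptions for illustration, not the algorithm proposed in that work.

```python
import numpy as np

def sign_error_lms_with_averaging(X, d, mu=0.05, burn_in=200):
    """Sign-error LMS with iterate averaging (a minimal two-stage sketch).

    Stage 1: w_{k+1} = w_k + mu * sign(d_k - x_k^T w_k) * x_k  (constant, relatively large mu)
    Stage 2: report the running average of the iterates after a burn-in period.
    """
    n, p = X.shape
    w = np.zeros(p)
    w_bar, count = np.zeros(p), 0
    for k in range(n):
        e = d[k] - X[k] @ w               # a-priori error
        w = w + mu * np.sign(e) * X[k]    # sign-error update with constant step size
        if k >= burn_in:                  # iterate averaging (second stage)
            count += 1
            w_bar += (w - w_bar) / count
    return w_bar

# Usage on synthetic data: estimate a filter w_true from noisy linear measurements.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -0.5, 0.25])
X = rng.standard_normal((5000, 3))
d = X @ w_true + 0.1 * rng.standard_normal(5000)
print(sign_error_lms_with_averaging(X, d))
```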
Constant Step Size Least-Mean-Square: Bias-Variance Trade-offs and Optimal Sampling Distributions
We consider the least-squares regression problem and provide a detailed asymptotic analysis of the performance of averaged constant-step-size stochastic gradient descent (a.k.a. least-mean-squares). In the strongly-convex case, we provide an asymptotic expansion up to explicit exponentially decaying terms. Our analysis leads to new insights into stochastic approximation algorithms: (a) it gives...
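The following minimal sketch shows averaged constant-step-size SGD (LMS) on a synthetic least-squares problem, in the spirit of the estimator discussed above; the step size gamma, the averaging-from-the-start convention, and the data-generating model are illustrative assumptions.

```python
import numpy as np

def averaged_constant_step_lms(X, y, gamma=0.01):
    """Averaged constant-step-size SGD (LMS) for least-squares regression (sketch)."""
    n, p = X.shape
    theta = np.zeros(p)
    theta_bar = np.zeros(p)
    for k in range(n):
        grad = (X[k] @ theta - y[k]) * X[k]         # stochastic gradient of 0.5*(x^T theta - y)^2
        theta = theta - gamma * grad                # constant-step-size SGD iterate
        theta_bar += (theta - theta_bar) / (k + 1)  # running (Polyak-Ruppert style) average
    return theta_bar

# Usage on synthetic, well-conditioned data:
rng = np.random.default_rng(1)
theta_star = np.array([2.0, -1.0])
X = rng.standard_normal((10000, 2))
y = X @ theta_star + 0.5 * rng.standard_normal(10000)
print(averaged_constant_step_lms(X, y))
```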
Regime Switching Stochastic Approximation Algorithms with Application to Adaptive Discrete Stochastic Optimization
This work is devoted to a class of stochastic approximation problems with regime switching modulated by a discrete-time Markov chain. Our motivation stems from using stochastic recursive algorithms for tracking Markovian parameters such as those in spreading code optimization in CDMA (code division multiple access) wireless communication. The algorithm uses constant step size to update the incr...
Convergence Rate and Averaging of Nonlinear Two-Time-Scale Stochastic Approximation Algorithms
The first aim of this paper is to establish the weak convergence rate of nonlinear two-time-scale stochastic approximation algorithms. Its second aim is to introduce the averaging principle in the context of two-time-scale stochastic approximation algorithms. We first define the notion of asymptotic efficiency in this framework, then introduce the averaged two-time-scale stochastic approximatio...
Averaged Least-Mean-Squares: Bias-Variance Trade-offs and Optimal Sampling Distributions
We consider the least-squares regression problem and provide a detailed asymptotic analysis of the performance of averaged constant-step-size stochastic gradient descent. In the strongly-convex case, we provide an asymptotic expansion up to explicit exponentially decaying terms. Our analysis leads to new insights into stochastic approximation algorithms: (a) it gives a tighter bound on the allo...